Annals of Emerging Technologies in Computing (AETiC)
Paper #1
A Study of Prediction Accuracy of English Test Performance Using Data Mining and Analysis
Yujie Duan
Abstract: This paper focuses on the effectiveness of data mining in predicting students' English test scores. As data mining and analysis techniques advance, they are finding more applications in teaching, and using data mining to predict students' test scores provides important support for educational work. In this paper, the C4.5 decision tree algorithm was improved by incorporating a Taylor-series expansion, and data from students' English tests in 2019-2020 were collected for experiments. The results showed that the scores of “Comprehensive English” and “Specialized English” had a strong influence on the CET-4 score, and that the improved C4.5 algorithm was more efficient than the original, maintained a fast computation speed even on large data volumes, and achieved an accuracy of more than 85%. These results demonstrate the accuracy of the improved C4.5 algorithm in predicting students' English test scores, and the improved algorithm can be extended to practical use.
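The abstract does not give the paper's actual derivation, but the kind of Taylor-series shortcut it alludes to can be sketched as follows: for a binary split with class counts a and b, the first-order expansion ln(1 − x) ≈ −x removes the logarithms from the entropy, leaving a cheap polynomial 2ab / (n² ln 2) that preserves the ordering of candidate splits. Every name and the exact approximation below are illustrative assumptions, not the paper's formula.

```python
import math

def entropy_exact(a, b):
    """Exact binary Shannon entropy (bits) for a split with class counts a, b."""
    n = a + b
    h = 0.0
    for c in (a, b):
        p = c / n
        if p > 0:
            h -= p * math.log2(p)
    return h

def entropy_taylor(a, b):
    """Apply ln(1 - x) ~ -x to each log term: the entropy collapses to
    2ab / (n^2 ln 2), which needs no logarithm calls on large data."""
    n = a + b
    return 2 * a * b / (n * n * math.log(2))

# Candidate splits ranked identically by the exact and approximate measures.
splits = [(1, 9), (2, 8), (3, 7), (5, 5)]
for a, b in splits:
    print(a, b, round(entropy_exact(a, b), 4), round(entropy_taylor(a, b), 4))
```

The approximation is not numerically close to the true entropy, but because both functions increase monotonically towards a balanced split, the attribute chosen at each tree node is unchanged, which is what matters for C4.5.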
Keywords: College English Test-4; Data mining; Decision tree; English test; Score prediction.
Download Full Text
Paper #2
Enhancing Feature Extraction Technique Through Spatial Deep Learning Model for Facial Emotion Detection
Nizamuddin Khan, Ajay Singh and Rajeev Agrawal
Abstract: Automatic facial expression analysis is a fascinating and difficult subject with implications in a wide range of fields, including human–computer interaction and data-driven approaches. A variety of techniques are employed to identify emotions from facial traits. This article examines recent automatic data-driven and handcrafted approaches for recognising facial emotions. These approaches offer computationally complex solutions that achieve good accuracy when training and testing are conducted on the same dataset, but they perform less well on the most challenging realistic dataset, FER-2013. The article's goal is to present a robust model with lower computational complexity that predicts emotion classes more accurately than current methods, moving towards a realistic, general-purpose facial expression system. A crucial step in good facial expression identification is extracting appropriate features from the face images. In this paper, we examine how well-known deep learning techniques perform at facial expression recognition and propose a convolutional neural network-based enhanced version of a spatial deep learning model that extracts the most relevant features with less computational complexity. This yields a significant improvement on the most challenging dataset, FER-2013, which suffers from occlusions and scale and illumination variations, producing the best feature extraction and classification and a maximum accuracy of 74.92%. The model also achieves correct emotion prediction rates of 99.47% and 98.5% on large numbers of samples from the CK+ and FERG datasets, respectively. It is capable of focusing on the major features of the face and achieves greater accuracy than previous models.
Keywords: Convolutional neural network; Facial expression recognition; Spatial deep learning; Spatial transform network.
Download Full Text
Paper #3
Monte Carlo Simulation of Cone X-ray Beam and Dose Scoring on Voxel Phantom with Open Source Software EGSnrcmp
Nikolaos Chatzisavvas, Dimitrios Nikolopoulos, Georgios Priniotakis, Ioannis Valais, Thanasis Koustas and Georgios Karpetas
Abstract: Radiation is used nowadays for inspection, therapy, food safety, and diagnostic purposes. Our daily lives involve devices such as airport scanners, projectional radiography units, CT scanners, treatment heads, and cargo inspection systems. However, these systems are extremely complicated and cost a significant amount of money to study, maintain, and conduct research with. Monte Carlo is the ideal method for simulating such systems and achieving findings remarkably comparable to experimental ones. Simulation software, however, is not always free, open source, and accessible to everyone. Open-source software has gained popularity in the technological age we live in, and practically all significant software sectors now use open-source tools. With the aid of an open-source, thoroughly validated software package called EGSnrcmp, we were able to describe an abstract model geometry of a cone-beam computed tomography (CBCT) X-ray source, produce patient-specific phantoms, and score dose values based on the characteristics of the cone-beam source. We outline the necessary methods and provide useful details about how to conduct such studies inside the software's ecosystem. Our study focuses on the relationship between the cone-beam source's field of view (FOV) and its impact on patient dose, by emulating a CBCT examination. To characterize our CBCT source, we employed stainless steel for the collimator and tungsten (W) for the anode. The electrons we use have an energy of 100 keV, the most frequent energy at which these examinations are conducted. We were able to score the absorbed dose within a phantom produced from DICOM images of a real patient, demonstrate the relationship between the FOV of the beam and the absorbed dose, and verify the CBCT source against theoretical values.
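To illustrate the dose-scoring idea in miniature (this is not the EGSnrcmp workflow, whose geometry, cross-sections, and transport physics are far richer), a toy Monte Carlo can deposit photon energy along a one-dimensional column of water voxels with simple exponential attenuation. Every constant below (attenuation coefficient, cone half-angle, voxel size) is a hypothetical placeholder.

```python
import math
import random

random.seed(0)

# Toy voxel phantom: a 1-D column of water voxels along the beam axis.
N_VOXELS, VOXEL_CM = 10, 1.0
MU = 0.17                       # illustrative attenuation coefficient (1/cm)
HALF_ANGLE = math.radians(10)   # cone half-angle standing in for the FOV

def simulate(n_photons):
    """Score energy deposited per voxel for n_photons cone-beam photons."""
    dose = [0.0] * N_VOXELS
    for _ in range(n_photons):
        # Sample a polar angle inside the cone (uniform over the disc).
        theta = HALF_ANGLE * math.sqrt(random.random())
        path = VOXEL_CM / math.cos(theta)   # slant path through each voxel
        energy = 1.0                        # arbitrary units per photon
        frac = 1.0 - math.exp(-MU * path)   # fraction absorbed per voxel
        for v in range(N_VOXELS):
            absorbed = energy * frac
            dose[v] += absorbed
            energy -= absorbed
    return dose

dose = simulate(5000)
print([round(d, 1) for d in dose])  # dose falls off with depth
```

Even this toy version shows the expected depth-dose fall-off; in the real study the phantom voxels come from patient DICOM images and the scoring is done by EGSnrcmp itself.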
Keywords: Medical Imaging; Monte Carlo; Open-source; Simulation; Software; Spectral distribution.
Download Full Text
Paper #4
Weighted Sum Metrics-Based Load Balancing RPL Objective Function for IoT
Poorana Senthilkumar Subramani and Subramani Bojan
Abstract: The technological development of Internet of Things (IoT) applications is emerging and attracting real-world attention in the automated industry, agriculture, the environment, and the scientific community. In most scenarios, extending the lifetime of an IoT network is highly challenging because of its constrained nodes. The wireless sensor network (WSN) is the core component of IoT applications. In addition, the WSN nodes are required for the network processes, particularly routing, energy maintenance, load balancing, congestion control, packet delivery, quick response, and more. The failure of any of these network processes affects the entire network operation. The IPv6 Routing Protocol for Low-power and Lossy Networks (RPL) provides strong routing solutions for IoT application requirements, but load balancing, congestion control, traffic load, and bottleneck problems remain open issues in RPL. To resolve the load-balancing issue, we propose a weighted sum method objective function (WSM-OF), which selects an alternative parent in routing based on RPL metrics. WSM-OF adopts congestion control and load balancing to avoid heavy traffic and extend node lifetime. The network parameters of control overhead, jitter, packet delivery ratio, parent switching, energy consumption, latency, and network lifetime are implemented and analyzed in the COOJA simulator. The results show that WSM-OF improves network performance and enhances the network lifetime by up to 7.8%.
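The general shape of a weighted-sum objective function for RPL parent selection can be sketched as below: each candidate parent's metrics are normalised into "lower is better" costs, combined with fixed weights, and the lowest-cost parent wins. The metric set, weights, and normalisation here are illustrative assumptions; the paper's actual WSM-OF formulation is not reproduced.

```python
# Hypothetical candidate parents with typical RPL metrics:
# ETX (lower is better), residual energy fraction (higher is better),
# queue utilisation (lower is better).
candidates = {
    "P1": {"etx": 1.8, "energy": 0.60, "queue": 0.40},
    "P2": {"etx": 2.4, "energy": 0.90, "queue": 0.10},
    "P3": {"etx": 1.5, "energy": 0.20, "queue": 0.70},
}

# Illustrative weights summing to 1; real deployments would tune these.
WEIGHTS = {"etx": 0.5, "energy": 0.3, "queue": 0.2}

def wsm_cost(m):
    """Weighted sum of normalised costs, all mapped into [0, 1]."""
    return (WEIGHTS["etx"] * m["etx"] / 3.0        # 3.0 = nominal worst ETX
            + WEIGHTS["energy"] * (1.0 - m["energy"])  # invert: low battery costs more
            + WEIGHTS["queue"] * m["queue"])

best = min(candidates, key=lambda p: wsm_cost(candidates[p]))
print(best)  # prints "P2": its full battery and empty queue outweigh its ETX
```

Balancing across several metrics like this is what lets the objective function steer traffic away from congested or energy-depleted parents instead of always minimising hop-quality alone.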
Keywords: IoT; LLNs; Load Balance; Objective Functions; Routing Metrics; RPL; WSM-OF.
Download Full Text
Paper #5
A General Architecture for a Trustworthy Creditworthiness-Assessment Platform in the Financial Domain
Giandomenico Cornacchia, Vito W. Anelli, Fedelucio Narducci, Azzurra Ragone and Eugenio Di Sciascio
Abstract: The financial domain is making huge advancements thanks to the exploitation of artificial intelligence. For example, the creditworthiness-assessment task is now strongly based on machine learning algorithms that make decisions independently of humans. Several studies have shown remarkable improvements in reliability, customer care, and return on investment. Nonetheless, many users remain sceptical, since they perceive the whole process as only partially transparent. Trust in the system's decisions, the guarantee of fairness in the decision-making process, and the explanation of the reasons behind a decision are just some of the open challenges for this task. Moreover, from the financial institution's perspective, another compelling problem is credit-repayment monitoring. Here too, traditional models (e.g., credit scorecards) and machine learning models can help the financial institution identify, at an early stage, customers who will default on payments. The monitoring task is critical to debt-repayment success, identifying bad debtors or simply users who are momentarily in difficulty; the financial institution can thus prevent possible defaults and, where possible, meet the debtor's needs. In this work, the authors propose an architecture for a creditworthiness-assessment platform that can meet the transparency needs of customers while monitoring credit-repayment risk. This preliminary study carried out an experimental evaluation of the component devoted to credit-score computation and credit-repayment monitoring. The study shows that the authors' architecture can be an effective tool for improving current credit-scoring systems: combining a static approach with a subsequent dynamic one can correct mistakes made in the first phase and foil possible false positives for good creditors.
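The static-then-dynamic combination the abstract describes can be sketched as a two-stage check: a static score decides the initial approval, and a dynamic monitor watches repayments afterwards and can raise an early warning before a default materialises. All names, thresholds, and the warning rule below are hypothetical; they are not the authors' components.

```python
# Illustrative two-stage creditworthiness check (all values hypothetical).
STATIC_THRESHOLD = 0.5   # minimum static credit score to approve
MISSED_LIMIT = 2         # missed instalments tolerated before a warning

def static_decision(score):
    """Stage 1: approve or reject on the static credit score alone."""
    return score >= STATIC_THRESHOLD

def dynamic_monitor(payments):
    """Stage 2: payments is a list of booleans, True = paid on time.
    Flags an approved customer whose repayment behaviour deteriorates."""
    missed = sum(1 for paid in payments if not paid)
    return "warning" if missed >= MISSED_LIMIT else "ok"

# A customer approved statically may still trigger an early warning.
score, history = 0.72, [True, False, True, False]
status = dynamic_monitor(history) if static_decision(score) else "rejected"
print(status)  # two missed instalments reach the limit, so this prints "warning"
```

The converse is the point the abstract closes on: a good creditor mis-scored in the static phase shows a clean repayment history, so the dynamic stage keeps reporting "ok" and the false positive is foiled.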
Keywords: Credit Scoring; Early Warning; Explainability; Fairness.
Download Full Text
International Association for Educators and Researchers (IAER), registered in England and Wales - Reg #OC418009. Copyright © IAER 2019.